NEATH II: N2H+ as a tracer of imminent star formation in quiescent high-density gas
Star formation activity in molecular clouds is often found to be correlated
with the amount of material above a column density threshold. Attempts to
connect this column density threshold to a density above which star formation
can occur are limited by the fact that the volume density of gas is difficult
to reliably measure from observations. We post-process hydrodynamical
simulations of molecular clouds with a time-dependent chemical network, and
investigate the connection between commonly-observed molecular species and
star formation activity. We find that many molecules widely assumed to
specifically trace the dense, star-forming component of molecular clouds
(e.g. HCN, HCO+, CS) actually also exist in substantial quantities in material
only transiently enhanced in density, which will eventually return to a more
diffuse state without forming any stars. By contrast, N2H+ only exists in
detectable quantities above a critical volume density, the point at which CO,
which reacts destructively with N2H+, begins to deplete out of the gas phase
onto grain surfaces. This density threshold for detectable quantities of N2H+
corresponds very closely to the volume density at which gas becomes
irreversibly gravitationally bound in the simulations: the material traced by
N2H+ never reverts to lower densities, and quiescent regions of molecular
clouds with visible N2H+ emission are destined to eventually form stars.
The N2H+ line intensity is likely to directly correlate with the star
formation rate averaged over timescales of around a Myr.
Comment: 10 pages, 10 figures. MNRAS accepted.
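The distinction the abstract draws, with dense-gas tracers lighting up in transient over-densities while N2H+ appears only in irreversibly bound gas, can be caricatured in a few lines. Everything below (the threshold values, the mock cell data, the variable names) is hypothetical, chosen only to illustrate the selection effect, not taken from the paper:

```python
# Toy illustration: which simulation cells does each tracer select?
# All thresholds and cell values are invented for this sketch.

N_CRIT_N2HP = 1.0e4   # assumed critical density (cm^-3) for detectable N2H+
N_BRIGHT_HCN = 1.0e3  # assumed density at which 'dense gas' tracers emit

# Mock simulation cells: (H2 number density in cm^-3, will this gas form stars?)
cells = [(5.0e2, False), (3.0e3, False), (2.0e4, True), (8.0e4, True)]

# HCN-bright cells include transient over-densities that never form stars...
transient_hcn = sum(1 for n, forms_stars in cells
                    if n >= N_BRIGHT_HCN and not forms_stars)

# ...whereas every N2H+-detectable cell is (by construction here) star-forming.
n2hp_cells = [(n, forms_stars) for n, forms_stars in cells if n >= N_CRIT_N2HP]
```

The point of the toy is only that a tracer with a higher effective density threshold samples a cleaner subset of the future star-forming gas.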
Non-Equilibrium Abundances Treated Holistically (NEATH): the molecular composition of star-forming clouds
Much of what we know about molecular clouds, and by extension star formation,
comes from molecular line observations. Interpreting these correctly requires
knowledge of the underlying molecular abundances. Simulations of molecular
clouds typically only model species that are important for the gas
thermodynamics, which tend to be poor tracers of the denser material where
stars form. We construct a framework for post-processing these simulations with
a full time-dependent chemical network, allowing us to model the behaviour of
observationally-important species not present in the reduced network used for
the thermodynamics. We use this to investigate the chemical evolution of
molecular gas under realistic physical conditions. We find that molecules can
be divided into those which reach peak abundances at moderate densities and
decline sharply thereafter (such as CO and HCN), and those which peak at
higher densities and then remain roughly constant (e.g. NH3, N2H+). Evolving
the chemistry with physical properties held constant at their final values
results in a significant overestimation of gas-phase abundances for all
molecules, and does not capture the drastic variations in abundance caused by
different evolutionary histories. The dynamical evolution of molecular gas
cannot be neglected when modelling its chemistry.
Comment: 14 pages, 13 figures. MNRAS accepted.
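The headline result, that holding physical conditions fixed at their final values overestimates gas-phase abundances, can be illustrated with a deliberately minimal one-species toy network. The rate coefficients, the exponential density history, and the function names below are all invented for illustration; they are not the paper's network:

```python
import math

def evolve(x0, n_of_t, dt, t_end, k_form=1e-6, k_dep=1e-9):
    """Deliberately minimal one-species toy network (invented rates):
    formation scales with the gas density n(t); a slow density-dependent
    sink removes the molecule. Forward-Euler integration."""
    x, t = x0, 0.0
    while t < t_end:
        n = n_of_t(t)
        x += (k_form * n * (1.0 - x) - k_dep * n * x) * dt
        x = min(1.0, max(0.0, x))
        t += dt
    return x

T_END = 1000.0
TAU = T_END / math.log(100.0)   # density rises from 10 to 1000 over the run

# Follow the full density history...
x_dynamic = evolve(0.0, lambda t: 10.0 * math.exp(t / TAU), 1.0, T_END)
# ...versus holding conditions fixed at their *final* value throughout.
x_static = evolve(0.0, lambda t: 1000.0, 1.0, T_END)
```

Because the static run spends the whole integration at the final (highest) density, it accumulates far more of the molecule than the run that follows the rising density history, which is the sense in which equilibrium-style post-processing overestimates abundances.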
Quantitative analysis by renormalized entropy of invasive electroencephalograph recordings in focal epilepsy
Invasive electroencephalograph (EEG) recordings of ten patients suffering
from focal epilepsy were analyzed using the method of renormalized entropy.
Introduced as a complexity measure for the different regimes of a dynamical
system, the feature was tested here for its spatio-temporal behavior in
epileptic seizures. In all patients a decrease of renormalized entropy within
the ictal phase of seizure was found. Furthermore, the strength of this
decrease is monotonically related to the distance of the recording location to
the focus. The results suggest that the method of renormalized entropy is a
useful procedure for clinical applications like seizure detection and
localization of epileptic foci.
Comment: 10 pages, 5 figures.
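Renormalized entropy in the sense of Klimontovich compares the Shannon entropy of an observed (e.g. ictal) power spectrum with that of a reference (e.g. interictal) spectrum reshaped to carry the same mean effective energy. A minimal sketch of that idea follows; the bisection bracket, iteration count, and function names are arbitrary choices of this illustration, not the clinical pipeline of the paper:

```python
import math

def shannon(p):
    """Shannon entropy of a normalized distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def renormalized_entropy(p_ref, p_obs):
    """Sketch of Klimontovich-style renormalized entropy between two
    normalized power spectra. The reference spectrum is reshaped to
    p_ref**beta / Z, with beta chosen by bisection so that it carries the
    same mean effective energy eps = -ln p_ref as the observed spectrum.
    The result H(obs) - H(renormalized ref) is <= 0, and more negative the
    more 'ordered' (peaked) the observed spectrum is."""
    eps = [-math.log(q) for q in p_ref]
    e0 = min(eps)
    eps = [e - e0 for e in eps]        # shift for numerical safety only
    target = sum(p * e for p, e in zip(p_obs, eps))

    def mean_energy(beta):
        w = [math.exp(-beta * e) for e in eps]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, eps)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):               # mean_energy decreases monotonically
        mid = 0.5 * (lo + hi)
        if mean_energy(mid) > target:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in eps]
    z = sum(w)
    return shannon(p_obs) - shannon([wi / z for wi in w])
```

A narrowband (seizure-like) observed spectrum then yields a clearly negative value against a broadband reference, matching the ictal decrease reported in the abstract.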
Point process model of 1/f noise versus a sum of Lorentzians
We present a simple point process model of 1/f noise, covering different
values of the power-law exponent. The signal of the model consists of pulses
or events. The interpulse, interevent, interarrival, recurrence or waiting
times of the signal are described by the general Langevin equation with
multiplicative noise, and stochastically diffuse in some interval, resulting
in a power-law distribution. Our model is free from the requirement of a wide
distribution of relaxation times and from power-law forms of the pulses. It
contains only one relaxation rate and yields 1/f-like spectra over a wide
range of frequencies. We obtain explicit expressions for the power spectra and
present numerical illustrations of the model. Further, we analyze the relation
of the point process model of 1/f noise to the Bernamont-Surdin-McWhorter
model, which represents the signal as a sum of uncorrelated components. We
show that the point process model is complementary to the model based on the
sum of signals with a wide-range distribution of the relaxation times. In
contrast to the Gaussian distribution of the signal intensity of the sum of
uncorrelated components, the point process exhibits asymptotically a power-law
distribution of the signal intensity. The developed multiplicative point
process model of 1/f noise may be used for modeling and analysis of
stochastic processes in different systems with a power-law distribution of
the intensity of pulsing signals.
Comment: 23 pages, 10 figures, to be published in Phys. Rev.
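A heavily simplified sketch of such a multiplicative point process is given below: each interevent time performs a multiplicative random walk, reflected into a fixed interval, and the pulse times are the cumulative sums. The parameter values and the reflection rule are assumptions of this illustration; the paper's model specifies the interevent dynamics through a general Langevin equation:

```python
import random

def interevent_times(n, tau_min=1e-3, tau_max=1.0, sigma=0.05, seed=1):
    """Simplified multiplicative random walk for the interevent times,
    reflected into [tau_min, tau_max]; the stationary distribution of the
    times is then power-law-like over the interval."""
    rng = random.Random(seed)
    tau = 0.5 * tau_max
    out = []
    for _ in range(n):
        tau += sigma * tau * rng.gauss(0.0, 1.0)   # multiplicative noise
        if tau < tau_min:                          # reflect at the edges
            tau = 2.0 * tau_min - tau
        elif tau > tau_max:
            tau = 2.0 * tau_max - tau
        out.append(tau)
    return out

def event_times(taus):
    """Pulse occurrence times: cumulative sums of the interevent times."""
    t, out = 0.0, []
    for tau in taus:
        t += tau
        out.append(t)
    return out
```

Estimating a periodogram of the resulting pulse train (omitted here) is how one would check the low-frequency 1/f-like behaviour numerically.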
Dynamic Critical Behavior of the Swendsen-Wang Algorithm: The Two-Dimensional 3-State Potts Model Revisited
We have performed a high-precision Monte Carlo study of the dynamic critical
behavior of the Swendsen-Wang algorithm for the two-dimensional 3-state Potts
model. We find that the Li-Sokal bound
($\tau_{\mathrm{int},E} \geq \mathrm{const} \times C_H$)
is almost but not quite sharp. The ratio $\tau_{\mathrm{int},E}/C_H$ seems to
diverge either as a small power or as a logarithm.
Comment: 35 pages including 3 figures. Self-unpacking file containing the
LaTeX file, the needed macros (epsf.sty, indent.sty, subeqnarray.sty, and
eqsection.sty) and the 3 Postscript figures. Revised version fixes a
normalization error in \xi (with many thanks to Wolfhard Janke for finding
the error!). To be published in J. Stat. Phys. 87, no. 1/2 (April 1997).
Nonstationary random acoustic and electromagnetic fields as wave diffusion processes
We investigate the effects of relatively rapid variations of the boundaries
of an overmoded cavity on the stochastic properties of its interior acoustic or
electromagnetic field. For quasi-static variations, this field can be
represented as an ideal incoherent and statistically homogeneous isotropic
random scalar or vector field, respectively. A physical model is constructed
showing that the field dynamics can be characterized as a generalized diffusion
process. The Langevin--It\^{o} and Fokker--Planck equations are derived and
their associated statistics and distributions for the complex analytic field,
its magnitude and energy density are computed. The energy diffusion parameter
is found to be proportional to the square of the ratio of the standard
deviation of the source field to the characteristic time constant of the
dynamic process, but is independent of the initial energy density, to first
order. The energy drift vanishes in the asymptotic limit. The time-energy
probability distribution is in general not separable, as a result of
nonstationarity. A general solution of the Fokker--Planck equation is obtained
in integral form, together with explicit closed-form solutions for several
asymptotic cases. The findings extend known results on statistics and
distributions of quasi-stationary ideal random fields (pure diffusions), which
are retrieved as special cases.
Comment: 54 pages, 8 figures, to appear in J. Phys. A: Math. Theor.
Dynamic Critical Behavior of a Swendsen-Wang-Type Algorithm for the Ashkin-Teller Model
We study the dynamic critical behavior of a Swendsen-Wang-type algorithm for
the Ashkin--Teller model. We find that the Li--Sokal bound on the
autocorrelation time
($\tau_{\mathrm{int},E} \geq \mathrm{const} \times C_H$)
holds along the self-dual curve of the symmetric Ashkin--Teller model, and is
almost but not quite sharp. The ratio $\tau_{\mathrm{int},E}/C_H$ appears
to tend to infinity either as a logarithm or as a small power. In an appendix
we discuss the problem of extracting estimates of the exponential
autocorrelation time.
Comment: 59 pages including 3 figures, uuencoded g-compressed ps file.
Postscript size = 799740 bytes.
"Meaning" as a sociological concept: A review of the modeling, mapping, and simulation of the communication of knowledge and meaning
The development of discursive knowledge presumes the communication of meaning
as analytically different from the communication of information. Knowledge can
then be considered as a meaning which makes a difference. Whereas the
communication of information is studied in the information sciences and
scientometrics, the communication of meaning has been central to Luhmann's
attempts to make the theory of autopoiesis relevant for sociology. Analytical
techniques such as semantic maps and the simulation of anticipatory systems
enable us to operationalize the distinctions which Luhmann proposed as relevant
to the elaboration of Husserl's "horizons of meaning" in empirical research:
interactions among communications, the organization of meaning in
instantiations, and the self-organization of interhuman communication in terms
of symbolically generalized media such as truth, love, and power. Horizons of
meaning, however, remain uncertain orders of expectations, and one should
caution against reification from the meta-biological perspective of systems
theory.
Markov analysis of stochastic resonance in a periodically driven integrate-fire neuron
We model the dynamics of the leaky integrate-fire neuron under periodic
stimulation as a Markov process with respect to the stimulus phase. This avoids
the unrealistic assumption of a stimulus reset after each spike made in earlier
work and thus solves the long-standing reset problem. The neuron exhibits
stochastic resonance, both with respect to input noise intensity and stimulus
frequency. The latter resonance arises by matching the stimulus frequency to
the refractory time of the neuron. The Markov approach can be generalized to
other periodically driven stochastic processes containing a reset mechanism.
Comment: 23 pages, 10 figures.
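The setup being analyzed, a leaky integrate-and-fire neuron whose voltage is reset at a spike while the sinusoidal stimulus keeps its phase, is easy to simulate directly. All parameter values below are illustrative and the Euler-Maruyama scheme is this sketch's choice, not the paper's method; the returned spike phases are the quantity a phase-based Markov analysis operates on:

```python
import math
import random

def simulate_lif(t_end, dt=1e-4, tau_m=0.005, v_th=1.0, mu=0.9,
                 amp=0.8, freq=50.0, noise=0.5, seed=2):
    """Leaky integrate-and-fire neuron with sinusoidal drive (all parameter
    values are illustrative). The stimulus phase runs on across spikes --
    only the voltage is reset -- so the sequence of spike phases is the
    object the Markov analysis works with. Euler-Maruyama integration."""
    rng = random.Random(seed)
    v, t = 0.0, 0.0
    spike_phases = []
    sqrt_dt = math.sqrt(dt)
    while t < t_end:
        drive = mu + amp * math.sin(2.0 * math.pi * freq * t)
        v += dt * (drive - v) / tau_m + noise * sqrt_dt * rng.gauss(0.0, 1.0)
        if v >= v_th:
            spike_phases.append((freq * t) % 1.0)  # stimulus phase in [0, 1)
            v = 0.0                                # voltage reset only
        t += dt
    return spike_phases
```

Resetting the voltage but not the stimulus is exactly what distinguishes this setup from the earlier stimulus-reset assumption criticized in the abstract.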
A composite approach to produce reference datasets for extratropical cyclone tracks: application to Mediterranean cyclones
Many cyclone detection and tracking methods (CDTMs) have been developed in the past to study the climatology of extratropical cyclones. However, different CDTMs take different approaches to defining and tracking cyclone centers, which naturally leads to cyclone track climatologies with inconsistent physical characteristics. Moreover, it is typical for CDTMs to produce a non-negligible number of tracks of weak atmospheric features, which do not correspond to large-scale or mesoscale vortices and can differ significantly between CDTMs. The lack of consensus in CDTM outputs and the inclusion of significant numbers of uncertain tracks have long prevented the production of a commonly accepted reference dataset of extratropical cyclone tracks. Such a dataset could allow comparable results in the analysis of storm track climatologies and could also contribute to the evaluation and improvement of CDTMs. To cover this gap, we present a new methodological approach that combines overlapping tracks from different CDTMs and produces composite tracks that concentrate the agreement of more than one CDTM. In this study we apply this methodology to the outputs of 10 well-established CDTMs, originally applied to ERA5 reanalysis over the 42-year period 1979-2020. We tested the sensitivity of our results to the spatiotemporal criteria that identify overlapping cyclone tracks, and for benchmarking purposes we produced five reference datasets of subjectively tracked cyclones. Results show that the climatological numbers of composite tracks are substantially lower than those of individual CDTMs, while benchmarking scores (i.e., the number of subjectively tracked cyclones captured by the composite tracks) remain high. Our results show that composite tracks tend to describe more intense and longer-lasting cyclones, with more distinct early, mature and decay stages, than the cyclone tracks produced by individual CDTMs.
Ranking the composite tracks according to their confidence level (defined by the number of contributing CDTMs), we show that the higher the confidence level, the more intense and long-lasting the resulting cyclones. Given the advantage of our methodology in producing cyclone tracks with physically meaningful and distinctive life stages, we propose composite tracks as reference datasets for climatological research in the Mediterranean. The Supplement provides the composite Mediterranean tracks for all confidence levels, and in the conclusion we discuss their appropriate use for scientific research and applications.
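The core operations of such a compositing scheme, deciding whether two CDTM tracks describe the same cyclone and counting how many detectors support a composite track, might be sketched as follows. The 300 km matching radius, the 50% overlap threshold, and all function names are assumptions of this illustration, not the criteria used in the study:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r1, r2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = r2 - r1, math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2.0) ** 2
         + math.cos(r1) * math.cos(r2) * math.sin(dlon / 2.0) ** 2)
    return 6371.0 * 2.0 * math.asin(math.sqrt(a))

def overlap_fraction(track_a, track_b, max_dist_km=300.0):
    """Hypothetical spatiotemporal overlap criterion: among time steps present
    in both tracks, the fraction at which the two cyclone centres lie within
    max_dist_km of each other. Tracks are lists of (time, lat, lon)."""
    pos_b = {t: (lat, lon) for t, lat, lon in track_b}
    common = [(lat, lon, pos_b[t]) for t, lat, lon in track_a if t in pos_b]
    if not common:
        return 0.0
    hits = sum(1 for la, lo, (lb, lob) in common
               if haversine_km(la, lo, lb, lob) <= max_dist_km)
    return hits / len(common)

def confidence_level(track, other_tracks, min_overlap=0.5):
    """Confidence level of a composite track: its own CDTM plus every other
    CDTM whose track overlaps it sufficiently."""
    return 1 + sum(1 for other in other_tracks
                   if overlap_fraction(track, other) >= min_overlap)
```

Merging the matched tracks into a single composite centre line (e.g. by averaging matched positions per time step) would be the remaining step of such a pipeline.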